Patent abstract:
The invention proposes a tactile interface comprising a screen able to detect the approach and the position of a user's finger. The interface is configured to display on the screen at least a first graphic element (Ci_0) superimposed on a first touch-sensitive selection area and located in a first region, and to display at least one second graphic element (Cj_0) superimposed on a second touch selection zone and located in a second region of the screen distinct from the first. The interface is configured to estimate a trajectory of a point of the finger and the point of impact (Pxy(t)) of this trajectory on the screen and, when the point of impact (Pxy(t)) enters one of the regions, to move the graphic element (Ci(t)) of this region and the associated touch selection zone towards the point of impact (Pxy(t)), then, when the point of impact leaves the region, to bring the display of the graphic element back to its initial state (Ci_0).
Publication number: FR3028968A1
Application number: FR1461287
Filing date: 2014-11-21
Publication date: 2016-05-27
Inventor: Stephane Regnier
Applicant: Renault SAS
IPC main class:
Patent description:

[0001] Graphic interface and method for managing the graphic interface during the tactile selection of a displayed element. The invention relates to tactile interfaces, in particular touch interfaces embedded on board motor vehicles, or tactile interfaces used to control systems on which a user intervenes while being forced to keep his attention focused on other tasks, for example on the monitoring of a production machine. In such configurations, the user must interact with the touch interface while keeping his attention largely available for tasks other than operating the interface, and, if the interface is not very large, the user may have difficulty selecting a menu item. To select such a menu item, he must apply his finger to a given location of the interface corresponding to the touch selection area of the menu item, highlighted on the screen by an icon or, more generally, a graphic symbol displayed substantially at the location of the touch selection area. In particular, in a motor vehicle, when the vehicle is in motion, the selection gestures of the user, who must constantly monitor the road, may be imprecise. To overcome these disadvantages, some manufacturers of mobile terminals have developed large screens or text input systems in which, for example, a letter touched by the finger is graphically magnified. This magnified display, shown at a distance from the icon to be activated and from the finger, is maintained for a short period of time, long enough however for the user to read the letter he has entered and to check visually that he has made the entry he wanted. This type of display still requires the entry to be made on a restricted area of the screen that is different for each new letter.
The object of the invention is to propose a human/machine interface system making it possible to reduce the occurrence of input errors in the elements of a menu, by making it easier for the user to enter the desired graphic element without necessarily increasing the size of the screen, and by leaving more margin in the gestures to be performed to make the entry. To this end, the invention provides a graphical or touch interface comprising a screen capable of detecting the approach and position of a user's finger. The detection is preferably performed within a predefined volume, characterized in particular by a detection threshold distance from the screen. Detection therefore takes place at least in a predefined volume, but could extend to a wider space. The interface is configured to display on the screen at least one first graphical element superimposed on a first touch selection zone and located in a first region, and to display at least one second graphical element superimposed on a second touch selection zone and located in a second region of the screen distinct from the first region. The interface is configured to estimate a trajectory of a point of the finger and the point of impact of that trajectory on the screen, and is configured, when the point of impact is detected in one of the regions or travelling towards one of these regions, to move the graphic element of that region and the associated touch selection area towards the point of impact, then, when the point of impact exits the region, to bring the display of the graphic element back to its initial state. The direction of movement of the graphic element can be defined by the displacement of a particular point of the graphic element, designated later in the description as the anchor point in its initial position, and as the centering point when moved.
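By way of illustration, this move-toward-impact and restore-on-exit behaviour can be sketched as follows; this is a minimal sketch assuming rectangular regions and a fractional travel parameter, and all names (`Region`, `update_display`, `travel`) are illustrative rather than taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Region:
    bounds: tuple       # (x0, y0, x1, y1), a simplified rectangular boundary
    anchor: tuple       # initial anchor point of the region's graphic element
    element_pos: tuple  # position at which the element is currently drawn

    def contains(self, p):
        x0, y0, x1, y1 = self.bounds
        return x0 <= p[0] < x1 and y0 <= p[1] < y1

def update_display(regions, impact, travel):
    """Move the element of the region containing the estimated impact
    point a fraction `travel` (0 = no move, 1 = centred on the impact)
    of the way from its anchor; restore every other element."""
    for r in regions:
        if r.contains(impact):
            ax, ay = r.anchor
            r.element_pos = (ax + travel * (impact[0] - ax),
                             ay + travel * (impact[1] - ay))
        else:
            r.element_pos = r.anchor  # back to the initial state
```

The associated touch selection zone would be translated by the same vector, so that selection follows the displayed element.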
Here, the term "screen" of the tactile or graphical interface denotes three two-dimensional regions of space that can be superimposed on one another, possibly through a change of coordinates computed by an electronic control unit managing the detection operations performed by the touch interface as well as the displays on the screen of the touch interface.
[0002] The first of these three regions is constituted by the actual display screen, which displays the graphic elements to the user to indicate the regions of space with which he must interact. The second region is the detection unit of the touch or graphical interface, associated with a touch-sensitive flat surface superimposed on the display screen, or associated with another space-sensing system allowing in particular the position of the user's finger to be detected in the vicinity of the display screen, identified in the proper coordinates of the detection interface. The third region is defined by datum coordinate values of a virtual reference screen, stored by the electronic control unit and grouped by regions of the virtual screen, expressed in a coordinate system specific to the electronic control unit. These regions of the virtual screen are for example defined by surface areas or by sets of boundary lines. With respect to these regions, the initial anchor points of the graphical elements are memorized, then their subsequent centering points are calculated, as are, at each instant, the coordinates of the other points of each graphical element, which can then be translated into the coordinate system of the display screen. The position of the finger in the coordinates of the detection interface can for example be translated into the coordinate system of the electronic control unit to identify its position with respect to the different boundaries, and then to calculate over time the positions of the centering points of the displayed graphic elements, which are then translated into positions on the display screen.
[0003] Preferably, the first region is delimited by a first boundary, and the second region is delimited by a second boundary which has a border portion common to the first boundary. In an initial state, the respective graphical elements and touch zones each surround the associated anchor point and lie within the region associated with them. The boundaries of each region remain fixed during the display of a given selection menu, throughout the interaction between the finger and the interface, until a touch of the finger on the screen possibly triggers the display of another selection menu. The actuating finger may be the finger of an operator or may be any element detectable by the interface, for example an elongated object such as a stylus adapted to allow the detection of a particular geometrical point by the touch interface. By point of impact or point of contact is meant the intersection with the screen of the trajectory estimated by the interface at a given moment, even if the finger does not touch the screen at this point, either at this time or later during the interaction with the interface. The display of the translated graphic element at a given time is substituted for the initial display of the graphic element and for the previous displays, if any, of the graphic element. When the point of impact is detected in the first region, the interface may be configured to calculate a translation vector between a first anchor point belonging to the first, non-displaced graphical element and a temporary centering point of the first, displaced graphical element, the centering point being a barycentre between the anchor point and the point of impact, and to perform a corresponding translation of the first graphical element and the associated touch selection zone, the relative distance between the centering point and the point of impact being calculated as an increasing function of the distance between the finger and the screen.
By barycentre is here meant a barycentre weighted between two points with weighting coefficients that may be variable functions, for example functions of the distance between the finger and the screen. By temporary centering point is meant a point of the screen on which the graphical element is centred during at least some phases of interaction of the interface with the finger. Centering is understood here in the broad sense of the term, the centering point being for example a surface barycentre, or a barycentre of certain characteristic points of the graphic element, the weighting coefficients of this barycentre being constant but not necessarily equal from one characteristic point to another. The representation of the graphic element around the temporary centering point is then a homothety or a bidirectional expansion of the initial representation of the graphic element around its anchor point. In a preferred embodiment, the anchor point and the temporary centering point are not apparent on the displayed graphical element. The ratio of the homothety or the ratios of the expansion are preferably greater than or equal to 1 as soon as the centering point no longer coincides with the initial anchor point. The temporary centering point is located between the anchor point and the impact point. The anchor point of the graphic element is a particular point of the screen associated with this graphic element, preferably contained within the boundaries of the graphic element for a reference display state of the interface. A reference display state corresponds, for example, to the display of a particular selection menu in the absence of an ongoing interaction with the finger. The anchor point may typically be the geometric center of a surface or contour defining the visible boundaries of the graphic element.
According to some embodiments, the anchor point of the graphic element may however be eccentric with respect to the graphic element, for example offset, relative to a geometric center of the element, towards the edge of the screen to which the graphic element is closest, to limit the risk of overlap between the displaced graphical element and the edge of the screen.
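The homothety or unidirectional expansion around the temporary centering point mentioned above can be sketched as follows; `expand_about` and the ratio names `fx`, `fy` are illustrative helpers, not terms from the patent:

```python
def expand_about(rect, center, fx, fy):
    """Scale an element's bounding box (x0, y0, x1, y1) about `center`.
    fx == fy >= 1 gives the homothety; fy == 1.0 gives a unidirectional
    expansion, e.g. only parallel to a nearby screen edge."""
    x0, y0, x1, y1 = rect
    cx, cy = center
    return (cx + (x0 - cx) * fx, cy + (y0 - cy) * fy,
            cx + (x1 - cx) * fx, cy + (y1 - cy) * fy)
```

With the centering point inside the box and ratios greater than 1, the element grows in place around the point on which it is centred.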
[0004] Preferably, the function for calculating the relative distance between the centering point and the point of impact vanishes when the finger touches the screen, that is to say when the distance between the finger and the screen becomes zero. In other words, the ratio between the distance from the centering point to the point of impact and the distance between the anchor point and the point of impact decreases as the finger approaches the screen, and vanishes when the finger touches the screen. According to a first variant embodiment, this ratio, in other words this relative approach distance, does not depend on the distance between the point of impact and the anchor point. According to another variant embodiment, this relative distance may decrease with the distance between the point of impact and the anchor point, for example if the graphic element is of reduced size compared to the distance separating two graphic elements. The finger can thus be positioned above the graphic element even when the latter has been moved close to the zone of influence of another graphic element. According to another variant embodiment, this relative distance may increase with the distance between the point of impact and the anchor point, for example if the graphic element is of a size comparable to the distance separating two graphic elements. This avoids the "displaced" graphic element encroaching unduly on the zones of influence of neighbouring graphic elements. The first region may be delimited by a first boundary and the second region may be delimited by a second boundary having a boundary portion common to the first boundary, and the interface may be configured to, when the point of impact crosses the common border portion from the first region to the second region, display at least temporarily both the first graphical element and the second graphical element out of their initial positions, respectively at a first intermediate position and a second intermediate position between the initial position of each graphic element and the point of impact. After crossing the common border portion, the interface may be configured to display the first graphical element at successive positions along the line joining it to the first anchor point, separated by first intervals whose length is a first increasing function of the distance from the point of impact to the common border portion.
[0005] The interface may be configured to calculate the first intervals using a first function which is moreover a decreasing function of the distance of the finger from the screen. The interface may be configured to, after crossing the common border portion, further display the second graphical element at successive positions along the line joining it to the point of impact, separated by second intervals, each second interval length being calculated from the length of the first interval used for displaying the first graphical element at the same time, by multiplying that first interval by a constant multiplier coefficient. The constant multiplier coefficient may be less than the ratio between a first distance, from the first anchor point to the common boundary portion, and a second distance, from the second anchor point to the common boundary portion.
[0006] The multiplier coefficient may for example be between 0.3 and 0.7 times the ratio between the first and the second distance. For example, if the multiplier coefficient is equal to 0.5 times the ratio between the first and the second distance, then, if the hesitant user stops his finger just after crossing the common border portion, the first graphical element returns step by step towards the first anchor point, while the second graphic element approaches the point of impact. When the first graphical element reaches the first anchor point, however, the second graphical element is only halfway through its displacement towards the point of impact. If the user remains immobile beyond this time of completed return of the first graphical element, the second graphical element can then be moved towards the impact point at a distance from the second anchor point computed by a displacement vector depending again on the distance from the second anchor point to the point of impact, instead of being calculated incrementally. The process triggered when the point of impact crosses the common boundary portion is symmetrical, whether a displacement of the first graphical element or a displacement of the second graphical element was triggered first. The interface may be configured to, as long as a graphical element is displayed out of its original position, display this translated graphical element expanded, along at least one direction, by a magnification factor. The expansion may correspond to a bidirectional homothety, but may, in certain embodiments, correspond to an expansion with two different ratios along two perpendicular axes of the screen, or to a unidirectional expansion.
For example, if the graphic element is near a display edge of the screen, the graphical element may be further expanded, or expanded only in the direction perpendicular to that edge, to delay the moment when the graphical element overlaps the edge of the screen as the finger approaches this edge. The magnification factor, i.e. the ratio of the homothety or the highest ratio of the bidirectional expansion, is preferably between 1.1 and 1.5, and more preferably between 1.15 and 1.35. The graphic element may alternatively, or in addition to the change in size, be highlighted by a change in brightness, contrast, colour or filling pattern, or by an evolution of the graphics evoking an approach or an embossing of the graphic element. The interface may be configured to allow a selection by finger contact at a point of the first touch selection area while the first touch selection area temporarily overlaps the second region and this point is within the second region. The invention further proposes a method for managing a touch interface capable of detecting the approach and position of a user's finger with respect to a screen of the interface, in which: in a first step, at least one first graphic element associated with a first touch selection zone is displayed on the screen, both surrounding a first anchor point of the graphic element on the screen and located within a same first region; in this first step, at least one second graphic element associated with a second touch selection zone is also displayed on the screen, both surrounding a second anchor point of the graphic element on the screen and located within a same second region; a trajectory of a point of the finger and the point of impact of this trajectory on the screen are repeatedly estimated; when the point of impact enters the first region, the displayed graphical element and the associated touch selection zone are moved towards the point of impact; if the point of impact crosses a portion of the border common to the first and the second region, the display of the first graphical element is brought back incrementally to the first anchor point, while the second graphical element is brought incrementally closer to the point of impact. The invention can of course be applied to modes of selection by position of the finger on a reconfigurable display area, even if the selection is not made in a tactile way in the primary sense of the term: it can for example be applied to a menu temporarily projected by optical means, for example onto a surface of a windshield of a motor vehicle, the interface comprising means for analysing the position of the actuating finger, for example from inputs of cameras positioned near the surface.
[0007] Other objects, features and advantages of the invention will be apparent from the following description, given by way of non-limiting example and with reference to the accompanying drawings, in which: - Figure 1 illustrates a motor vehicle equipped with an interface according to the invention; - Figure 2 illustrates a man-machine interface according to the invention; - Figure 3 is a graph characteristic of one of the operating modes of the interface of Figure 2; and - Figure 4 illustrates a portion of the interface of Figure 2 during a particular selection by the user of the interface. As illustrated in FIG. 1, a tactile interface 1 according to the invention may for example be embedded on board a motor vehicle 3 driven by a user 4 who, by moving his finger and touching certain points of a touchscreen of the interface 1, is thus able to issue instructions to an electronic control unit 2 for actuating different equipment of the vehicle, for example a ventilation system 5 of the vehicle or any other equipment of the vehicle. The electronic control unit 2 can also send to the touch interface 1 messages reflecting the operating status of the vehicle 3, so that the user 4 of the vehicle can take these data into account. FIG. 2 illustrates the operating principle of a touch interface 1 according to the invention. The touch interface 1 typically comprises a touch screen 6, delimited by edges 7, and a detection system (not shown) for detecting the position in three-dimensional space of a finger 11 of a user, in particular of a particular point Dxyz of this finger, and for detecting whether or not this finger is in contact with the touch screen 6. By touch screen is meant here any capture system operating by the movement of a finger and the approach of this finger to a validation surface.
[0008] The invention may for example be applied to detection systems optically projecting information onto an inert surface and observing the volume neighbouring this surface by means of various optical or infrared sensors, for example, so as to detect the position of a finger and whether or not the finger is in contact with the surface. A virtual reference screen, the representation of which is here superimposed on that of the touch screen 6, is typically delimited, by means of boundaries designated here by F1_4, F1_2, F4_1, F2_1, F2_3 and Fi_j, into regions or zones of influence, which are referenced R1, R2, R3, R4, Ri and Rj in FIG. 2. Each region corresponds to a selection zone of a menu displayed on the virtual screen corresponding to the touch screen 6. The regions liable to give rise to a validation each display a graphic element designated by the general reference 8, and more particularly referenced according to the regions C1_0, C2_0, C3_0, C4_0, Ci_0 and Cj_0. These graphic elements bearing a second index "0" correspond to an initial display of the menu on the touch screen 6. The detection system of the graphical interface is configured to detect the movement of the finger 11 and in particular of an end Dxyz of this finger, which at a time t is at the point Dxyz(t) and at the following instant t+dt is at a point Dxyz(t+dt). The electronic control unit 2 of the graphic interface is able to determine, for example by extrapolation of the detected successive points, a trajectory which is re-estimated at every moment and which is noted traj(t) in FIG. 2 for the trajectory estimated at time t, and traj(t+dt) for the trajectory estimated at time t+dt. Each of these calculated trajectories defines a point Pxy, which is here designated the point of impact or point of contact, although the contact or the impact remains theoretical, the point of contact being a point calculated as the likely target of the user. The point of contact Pxy is in reality the intersection of the trajectory and a contact surface of the screen, which may coincide with the display surface of the screen 6. The invention proposes, when the point of impact is at a distance less than a threshold distance from one of the graphical elements C1_0, C2_0, C3_0, C4_0, Ci_0, Cj_0, to modify the display of the graphical element and to bring the graphical element closer to the point of impact, in order to help the user, who can thus proceed to validate the corresponding option of the menu without deviating his finger from the current trajectory. To do this, a virtual anchor point 9 is arbitrarily defined for each graphic element to be displayed, in the coordinates of the virtual screen, this anchor point serving both to estimate the distance between the graphic element and the point of impact and to calculate the subsequent displacements of the display of the graphic element. In FIG. 2, these anchor points 9 are denoted respectively by the references B1_0 for the graphical element C1_0, B2_0 for the graphical element C2_0, ... Bi_0 for the graphical element Ci_0. These anchor points may, for convenience, correspond to a surface barycentre of the graphic element, or to a barycentre of an outline of the graphic element. According to an alternative embodiment, they may optionally be arbitrarily located near one of the boundaries of the graphic element. To determine whether the display of the graphical element Ci_0 is to be moved, the distance Ecart_i(t) can be compared either to a constant threshold or to a threshold which depends on the direction of the line connecting the anchor point Bi_0 and the point of contact Pxy(t). For example, it can be verified whether the point of contact is within the boundaries delimiting the region in which the graphic element in question is located in its initial state, in the absence of interaction with the finger. In FIG. 2, the distance at an instant t between the point of contact Pxy(t) of the trajectory traj(t) calculated at this instant t and the anchor point Bi_0 is denoted Ecart_i(t). As a function of this distance Ecart_i(t), and possibly as a function of the distance of the finger from the screen, a displacement noted here Ui(t) is applied to the graphical element Ci_0, which at a given instant corresponds to a vector joining the anchor point Bi_0 and a temporary centering point Bi(t). The temporary centering point Bi(t) occupies, with respect to the displaced graphical element Ci(t), the same barycentric position that the anchor point Bi_0 initially occupies with respect to the graphic element Ci_0 in its initial display configuration.
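The extrapolation of the trajectory to a theoretical contact point can be sketched as follows, assuming a straight line through the last two detected finger positions and a screen plane at z = 0; this is a simplification, since the patent leaves the exact re-estimation method open:

```python
def impact_point(d_prev, d_curr):
    """Extrapolate the line through two successive finger positions
    (x, y, z) down to the screen plane z == 0 and return the theoretical
    contact point Pxy, or None if the finger is not approaching."""
    (x0, y0, z0), (x1, y1, z1) = d_prev, d_curr
    dz = z1 - z0
    if dz >= 0:        # moving away from, or parallel to, the screen
        return None
    t = -z1 / dz       # how many further steps until z reaches 0
    return (x1 + t * (x1 - x0), y1 + t * (y1 - y0))
```

Re-running this on each pair Dxyz(t), Dxyz(t+dt) yields the successive points Pxy(t), Pxy(t+dt) used to re-centre the graphic elements.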
[0009] At a later time t+dt, the recalculated trajectory defines a new impact point Pxy(t+dt), whose position is used, together with the distance of the finger from the screen, to calculate a new position of the centering point Bi(t+dt) of the graphic element Ci(t+dt) displayed at that time. In order to improve the user's perception of the graphic element that is about to be selected, it is possible, as soon as the displacement of the display of the graphic element is activated, to accompany this displacement with an expansion of the dimensions of the graphic element, for example a homothety in all directions or possibly, depending on the space available on the screen, an expansion in only one of the directions of the screen. The size of the graphical element can then be kept constant as long as the displacement of the display of the graphical element remains in effect. Depending on the size of the graphical element and the amplitude of displacement Ui(t), it may happen that the graphical element overlaps one of the boundaries between regions. For example, in FIG. 2, the graphic element Ci(t+dt) is about to overlap the border F1_2. Each graphic element is associated with a touch selection zone which, when touched by the finger of the user, triggers an action corresponding to one of the menu options displayed on the screen. Preferably, the touch selection zone coincides with the area occupied by the graphic element.
[0010] The interface according to the invention can be configured so that, if, following the trajectory traj(t) performed by the finger, the graphic element and the associated tactile selection area overlap one of the borders and the finger arrives in contact with the screen at a point of the graphic element displayed at this time, a validation on the associated touch selection zone is then taken into account, even if the contact point is at this moment beyond the border of the region associated with the graphic element. In this way, the input of the user is facilitated, since the effective border of the region admitted for selecting a menu item is in some way deformed, to an extent depending on the trajectory of the user's finger, so as to widen the total admitted selection area by temporarily moving its boundaries.
[0011] FIG. 3 illustrates an example of a graph 20 relating the amplitude of displacement, here noted Ui(t), of a graphic element on an interface according to the invention, as a function of a distance h(t) between a finger of the user and the screen and of a distance Ecart_i(t) between the point of impact of the trajectory of the finger and the initial anchor point of the graphical element. The mapped surface 21 is here chosen to cancel any displacement of the graphic element when the distance of the finger from the screen exceeds a certain threshold h0, which can typically be the detection threshold distance of the touch interface. The mapped surface 21 is also chosen to cancel any displacement of the graphic element when the point of impact approaches the anchor point, since there is then no longer any need to move the graphic element. Typically, the displacement value Ui(t) can be chosen as a product of the distance Ecart_i(t) and a function which decreases with the distance of the finger from the screen and which vanishes at the threshold distance h0. One of the possible forms of functions for defining the displacement vector Ui(t) is to multiply the distance Ecart_i(t) directly by a concave or convex function of the distance h from the finger to the screen, for example a power of the complement to 1 of the ratio between the distance h of the finger and the threshold distance h0. If a power 1/2 is chosen for this function, the expression proposed in equation (1), corresponding to the graph of FIG. 3, is obtained:

Ui(t) = Dist(Bi_0, Pxy(t)) × √(1 − h(t)/h0) = Ecart_i(t) × √(1 − h(t)/h0)   Equation (1)

The advantage of choosing such a form of convex function is that it produces a "slowing" effect on the displacement of the graphic element when the finger is in the immediate vicinity of the screen, which avoids disturbing the user before the final selection.
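Equation (1) can be transcribed directly; a minimal sketch, with the clamp to zero beyond the detection threshold made explicit:

```python
import math

def displacement_amplitude(ecart_i, h, h0):
    """U_i(t) from Equation (1): zero at or beyond the detection
    threshold h0, and the full gap Ecart_i(t) when the finger touches
    the screen (h == 0)."""
    if h >= h0:
        return 0.0
    return ecart_i * math.sqrt(1.0 - h / h0)
```

Near the screen the square root varies slowly with h, which is the "slowing" effect noted above; near the threshold h0 it varies quickly, so the element starts moving promptly once the finger enters the detection volume.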
Another variant of the function Ui(t) may be envisaged, in which a power function is also applied to the distance Ecart_i(t) between the anchor point and the point of impact, so as to slow the movement of the graphical element when it approaches the boundaries of the region associated with the graphical element under consideration. The distance from the finger to the screen can be taken as the orthogonal distance h from the finger to the screen, as represented in FIGS. 2 and 3. According to a variant embodiment, not shown, the distance of the finger from the screen is taken as the distance between the point Dxyz of the finger closest to the screen and the point of impact Pxy(t) of the trajectory at this instant. According to another variant, it is possible to define a relative distance between the centering point Bi(t) of the graphic element and the point of impact Pxy(t) as a distance ratio Δi(t), defined as:

Δi(t) = (Ecart_i(t) − Ui(t)) / Ecart_i(t)   Equation (2)

This relative distance gives the remaining gap to be travelled by the graphical element so that this graphical element is centred around the point of impact.
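Under the square-root law of Equation (1), the ratio of Equation (2) reduces to 1 − √(1 − h/h0) and no longer depends on Ecart_i(t); a sketch of that special case:

```python
import math

def relative_distance(h, h0):
    """Delta_i(t) of Equation (2) under the law of Equation (1):
    (Ecart_i - U_i) / Ecart_i, which simplifies to 1 - sqrt(1 - h/h0).
    It is 0 when the finger touches the screen and 1 at the detection
    threshold h0."""
    if h >= h0:
        return 1.0
    return 1.0 - math.sqrt(1.0 - h / h0)
```

This corresponds to the first variant mentioned in paragraph [0004], in which the relative approach distance does not depend on the distance between the point of impact and the anchor point.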
[0012] This relative distance decreases as the finger approaches the screen and vanishes when the finger touches the screen. When the graphical element straddles the boundary of one of the regions, two hypotheses should be considered. In a first case, the validation of the menu item corresponding to the graphical element occurs while the displaced graphical element is still displayed entirely in the region associated with it. The interface then generally switches to a new display screen to confirm the selection, or to propose another menu. It is also possible that the user's finger continues to move towards the border of the region, either because of the vagaries of the vehicle's movements, or because, due to an insufficiently precise selection, it is the graphic element located on the other side of the border that the user actually wants to operate. The invention then proposes a transition display approach making it possible to indicate to the user that he is in a border zone between two validations, and to enable him to visualize immediately which graphic element is about to be dropped for selection and which graphic element is being put forward with regard to the trajectory of the user's finger.
[0013] FIG. 4 illustrates a particular example in which the user initially initiated the displacement of the display of a graphical element Ci, which has approached a boundary Fi_j between a region i and a region j of the screen 6 of the interface 1 illustrated in FIG. 2. Following the movement made by the user's finger, the point of impact Pxy(t), which was initially in the region i, has passed into the region j and is at a distance noted here DF(t) from the border Fi_j between the two regions. The point of impact Pxy(t) is at a distance noted here Ecart_i(t) from the anchor point Bi_0 of the graphic element Ci of the region i, and at a distance Ecart_j(t) from the anchor point Bj_0 of the graphic element Cj associated with the region j. At time t, the graphic element Ci(t) already overlaps the boundary Fi_j, and the graphic element Cj is still in its initial configuration Cj_0, since the theoretical point of impact of the finger has only just crossed the border Fi_j. Once the crossing of the boundary has been detected by the interface 1, it activates a progressive and simultaneous displacement of the two graphical elements of the regions Ri and Rj, so as to bring the graphical element of the region Ri back to its initial position, that is, its position in the absence of interaction, and to begin to move the graphic element of the region Rj towards the point of impact Pxy(t). In the following instants, the graphic element Ci is each time moved along the line connecting the point of impact Pxy and the anchor point Bi_0, by a first interval δ1 which is a function of the distance h of the finger from the screen and of the distance DF from the point of impact Pxy(t) to the boundary Fi_j. At the same time, the graphic element Cj is moved along the line connecting the anchor point Bj_0 to the point of impact Pxy, by a second interval δ2 which is also a function of the distance h of the finger from the screen and of the distance DF between the point of impact and the common border.
The values of the intervals δ1 and δ2 are chosen so that, if a hesitant user keeps his finger fixed in the position Pxy detected after the crossing of the border, the graphical element C_i returns to its initial position while the graphical element C_j has meanwhile travelled a portion of the distance separating it from the point of impact. The user then sees that the graphical element C_j is selected for activation, since it has begun its movement. To obtain this effect, it is possible, for example, to calculate the values of the intervals as values proportional to a first function of the distance of the finger from the screen. This first function vanishes at the detection threshold of the screen and increases as the finger approaches the screen. In one embodiment, a constant ratio between the first interval δ1 (displacement of the graphical element being deselected) and the second interval δ2 (displacement of the graphical element being preselected) can further be imposed. This ratio may for example be a function of the distances between the respective anchor points of the graphical elements and the point of impact, in other words a ratio equal to the ratio of the distances from the respective anchor points to the common border F_ij. Thus, denoting Dist_i the distance from the anchor point of the first graphical element to the common border, and Dist_j the distance from the anchor point of the second graphical element to the common border, expressions of the following type can be used:

δ1(h, DF) = K × (DF(t) / Dist_i) × √(1 - h(t)/h0)    Equation (3)

δ2(h, DF) = k × (DF(t) / Dist_j) × √(1 - h(t)/h0)    Equation (4)

The values k and K are constant coefficients, chosen in such a way that the ratio k/K is less than 1, for example equal to 1/2.
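Under the reconstruction of Equations (3) and (4) above (the exact functional form is partly garbled in the source, so the square-root factor and the division by Dist are reconstructed assumptions), the two intervals can be computed as:

```python
import math

def intervals(h, h0, d_f, dist_i, dist_j, K=1.0, k=0.5):
    """Reconstructed Equations (3) and (4):
        delta1 = K * DF(t) * sqrt(1 - h/h0) / Dist_i
        delta2 = k * DF(t) * sqrt(1 - h/h0) / Dist_j
    The common factor vanishes at the detection threshold h = h0 and
    grows as the finger approaches the screen; k/K < 1 (e.g. 1/2) so
    the deselected element C_i completes its retreat first.
    """
    f = math.sqrt(max(0.0, 1.0 - h / h0))
    return K * d_f * f / dist_i, k * d_f * f / dist_j
```

With comparable anchor distances Dist_i ≈ Dist_j and k/K = 1/2, each retreat step of C_i is twice the advance step of C_j, as in the FIG. 4 example.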
In this way, when the first graphical element C_i has returned to its initial position C_i,0, the second graphical element C_j will be halfway between its initial position C_j,0 and the point of impact Pxy. In the example illustrated in FIG. 4, the positions of the graphical elements C_i and C_j are represented, together with their displacements over three time intervals of amplitude dt, after a time t at which the boundary was crossed by the point of impact Pxy. It can thus be seen that the graphical element C_i(t+3dt) has taken three steps back, of value δ1, towards its initial position, while the graphical element C_j has taken three steps forward, of amplitude δ2, towards the point of impact Pxy. The amplitude of the step δ1 is greater than the amplitude of the step δ2 because the distances from the anchor points B_i,0 and B_j,0 to the common border F_ij are comparable. As an illustration of another embodiment, FIG. 4 also shows a graphical element position C'_j(t), which would correspond to the position of the graphical element C_j if no particular procedure were provided when crossing the border F_ij. This theoretical position is calculated according to the same scheme as for the displayed position of the graphical element C_i on the other side of the border. In the absence of a transient display procedure at border crossings, the two elements would both end up very close to the same point Pxy, which could lead to confusion in the perception of the user, who would have difficulty telling which is the previously selected graphical element and which one has just been activated. The procedure of gradually withdrawing the previously activated element according to the invention makes it possible to improve the user's perception of the chronology of the selected elements.
By convention, the distance between the point of impact and a displaced graphical element is measured as the distance between the point of impact and a centering point obtained by applying, to the initial anchor point, the translation vector of the graphical element. The relationship between the proximity of the finger to the screen and the proximity between the point of impact and the displaced equivalent of the anchor point is not necessarily linear. The invention is not limited to the exemplary embodiments described and admits numerous variants. The position of the finger relative to the interface may be detected by any tactile means or any selection means based on positioning the end of a finger. The functions for calculating the displacement of a graphical element may differ from those cited in the examples. Timers may be introduced at certain stages of the modified display process of the at least one graphical element. Modes for transparently displaying one graphical element relative to another may be provided if two graphical elements are to be displayed on overlapping zones.
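The distance convention stated above can be sketched as follows (an illustrative helper only, not an API defined by the patent):

```python
import math

def distance_to_displaced_element(impact, anchor, translation):
    """Distance between the point of impact and a displaced graphical
    element, measured to the initial anchor point translated by the
    element's current translation vector (the convention above)."""
    cx = anchor[0] + translation[0]
    cy = anchor[1] + translation[1]
    return math.hypot(impact[0] - cx, impact[1] - cy)
```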
Claims (10)
[0001]
1. Touch interface (1) comprising a display screen (6), the interface being able to detect the approach and the position of a finger (11) of a user relative to the screen, the interface being configured to display on the screen at least one first graphical element (C_i,0) associated with a first tactile selection zone and located in a first region, and to display at least one second graphical element (C_j,0) superimposed on a second tactile selection zone and located in a second region of the screen distinct from the first region, the interface being configured to estimate a trajectory (traj(t)) of a point of the finger and the point of impact (Pxy(t)) of this trajectory on the screen, and being configured for, when the point of impact (Pxy(t)) is detected in one of the regions, moving the graphical element (C_i(t)) of this region and the associated tactile selection zone towards the point of impact (Pxy(t)), and then, when the point of impact leaves the region, returning the display of the graphical element to its initial state (C_i,0).
[0002]
2. Touch interface according to claim 1, configured for, when the point of impact (Pxy(t)) enters the first region, calculating a translation vector (Ui(t)) between a first anchor point (B_i,0) belonging to the undisplaced first graphical element (C_i,0) and a temporary centering point (B_i(t)) of the displaced first graphical element (C_i(t)), the centering point (B_i(t)) forming a barycentre between the anchor point (B_i,0) and the point of impact (Pxy(t)), and performing a corresponding translation of the first graphical element (C_i(t)) and the associated tactile selection zone, the relative distance (Δi(t)) between the centering point and the point of impact being calculated as an increasing function of the distance (h) between the finger (11) and the screen (6).
[0003]
3. Touch interface according to claim 1 or 2, wherein the first region is delimited by a first boundary and the second region is delimited by a second boundary having a common boundary portion (F_ij) with the first boundary, the interface being configured for, when the point of impact (Pxy(t)) crosses the common boundary portion from the first region to the second region, displaying at least temporarily both the first graphical element (C_i(t+3dt)) and the second graphical element (C_j(t+3dt)) out of their initial positions, respectively at a first intermediate position and a second intermediate position between the initial position of the graphical element (B_i,0, B_j,0) and the point of impact.
[0004]
4. Touch interface according to claim 3, configured for, after crossing of the common boundary portion (F_ij), displaying the first graphical element at successive positions along a line joining the first anchor point, which are separated by first intervals (δ1) whose length is a first increasing function of the distance (DF(t)) from the point of impact to the common boundary portion.
[0005]
5. Touch interface according to claim 4, configured to calculate the first intervals (δ1) using a first function which is furthermore an increasing function of the proximity of the finger (h) to the screen (6).
[0006]
6. Touch interface according to claim 5, configured for, after crossing of the common boundary portion (F_ij), further displaying the second graphical element (C_j(t)) at successive positions along a line joining the point of impact, which are separated by second intervals (δ2), each second interval length being calculated from the length (δ1) of the first interval used for displaying the first graphical element at the same instant, by multiplying this first interval by a constant multiplier coefficient.
[0007]
7. Interface according to claim 6, wherein the constant multiplier coefficient is less than the ratio between a first distance (Dist_i) from the first anchor point to the common boundary portion and a second distance (Dist_j) from the second anchor point to the common boundary portion (F_ij).
[0008]
8. Touch interface according to one of claims 1 to 7, configured, as long as a graphical element is displayed out of its initial position, to display the translated graphical element by dilating the graphical element, along at least one direction, by a magnification factor.
[0009]
9. Interface according to one of claims 1 to 8, configured to allow a selection by the finger (11) at a point of the first tactile selection zone while the first tactile selection zone temporarily overlaps the second region and this point is located in the second region.
[0010]
10. Method for managing a touch interface (1) capable of detecting the approach and the position of a finger (11) of a user with respect to a screen of the interface, in which: - in a first step, at least one first graphical element (C_i,0) is displayed on the screen, associated with a first tactile selection zone, surrounding a first anchor point (B_i,0) of the graphical element on the screen, and located within a first region; - in this first step, at least one second graphical element (C_j,0) is displayed on the screen, associated with a second tactile selection zone, surrounding a second anchor point (B_j,0) of the graphical element on the screen, and located within a second region; - a trajectory (traj(t)) of a point of the finger and the point of impact (Pxy(t)) of this trajectory on the screen are repeatedly estimated; - when the point of impact enters the first region, the displayed graphical element (C_i(t)) and the associated tactile selection zone are moved towards the point of impact (Pxy(t)); - if the point of impact (Pxy(t)) crosses a portion (F_ij) of common border between the first and the second regions, the display of the first graphical element (C_i(t)) is incrementally brought back towards the first anchor point (B_i,0), while the second graphical element (C_j(t)) is incrementally brought closer to the point of impact (Pxy(t)).
Similar technologies:
Publication number | Publication date | Patent title
FR3028968A1|2016-05-27|GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE WHEN TACTILE SELECTING A DISPLAYED ELEMENT
US20190294241A1|2019-09-26|Method and apparatus for user interface using gaze interaction
WO2016079433A1|2016-05-26|Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element
FR2971066A1|2012-08-03|THREE-DIMENSIONAL MAN-MACHINE INTERFACE.
FR2851347A1|2004-08-20|Man-machine interface device for use in vehicle control panel, has touch screen moved with respect to case by actuator based on displacement patterns controlled by analysis and treatment unit to produce touch differential effects
RU2017102195A|2018-07-24|LOOKING BASED PLACEMENT OF OBJECTS IN A VIRTUAL REALITY
CN104345802A|2015-02-11|Method and device for controlling a near eye display
WO2003073254A2|2003-09-04|A method of providing a display for a gui
EP2981879B1|2017-06-14|Device for contactless interaction with an electronic and/or computer apparatus, and apparatus equipped with such a device
EP2956846B1|2020-03-25|Method, device and storage medium for navigating in a display screen
WO2015003544A1|2015-01-15|Method and device for refocusing multiple depth intervals, and electronic device
FR3005173A1|2014-10-31|INTERACTION METHOD IN AN AIRCRAFT COCKPIT BETWEEN A PILOT AND ITS ENVIRONMENT
FR2998071A1|2014-05-16|METHOD FOR SECURING A CONTROL ON A TOUCH-SURFACE VISUALIZATION DEVICE AND SYSTEM THEREOF
FR3036476A1|2016-11-25|AIRCRAFT FLIGHT INFORMATION VISUALIZATION SYSTEM AND ASSOCIATED METHOD
EP3355277A1|2018-08-01|Method for displaying on a screen at least one representation of an object, related computer program, electronic display device and apparatus
FR3036511A1|2016-11-25|AIRCRAFT FLIGHT INFORMATION VISUALIZATION SYSTEM AND ASSOCIATED METHOD
CN107335218B|2021-02-19|Game scene moving method and device, storage medium, processor and terminal
WO2021197689A1|2021-10-07|Method and device for managing multiple presses on a touch-sensitive surface
WO2014095363A1|2014-06-26|Terminal for monitoring traffic, particularly air traffic
FR2946768A1|2010-12-17|METHOD OF TACTILE INPUTTING CONTROL INSTRUCTIONS OF A COMPUTER PROGRAM AND SYSTEM FOR IMPLEMENTING SAID METHOD
WO2015082817A1|2015-06-11|Method for controlling the interaction with a touch screen and device implementing said method
KR101474552B1|2014-12-22|Method of computer running three dimensions contents display, apparatus performing the same and storage media storing the same
WO2021084198A1|2021-05-06|Method for interaction with a user of an immersive system and device for implementing such a method
Patent family:
Publication number | Publication date
CN107111449A|2017-08-29|
CN107111449B|2020-10-30|
FR3028968B1|2016-11-25|
WO2016079432A1|2016-05-26|
EP3221780A1|2017-09-27|
US20170308259A1|2017-10-26|
KR20170086103A|2017-07-25|
US10481787B2|2019-11-19|
KR102237452B1|2021-04-07|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US20090201246A1|2008-02-11|2009-08-13|Apple Inc.|Motion Compensation for Screens|
EP2105826A2|2008-03-25|2009-09-30|LG Electronics Inc.|Mobile terminal and method of displaying information therein|
US20110157040A1|2009-12-24|2011-06-30|Sony Corporation|Touchpanel device, and control method and program for the device|
US20110285665A1|2010-05-18|2011-11-24|Takashi Matsumoto|Input device, input method, program, and recording medium|
US20120120002A1|2010-11-17|2012-05-17|Sony Corporation|System and method for display proximity based control of a touch screen user interface|
US20140028557A1|2011-05-16|2014-01-30|Panasonic Corporation|Display device, display control method and display control program, and input device, input assistance method and program|
FR3002052A1|2013-02-14|2014-08-15|Fogale Nanotech|METHOD AND DEVICE FOR NAVIGATING A DISPLAY SCREEN AND APPARATUS COMPRISING SUCH A NAVIGATION|
US8624836B1|2008-10-24|2014-01-07|Google Inc.|Gesture-based small device input|
US8261211B2|2009-10-01|2012-09-04|Microsoft Corporation|Monitoring pointer trajectory and modifying display interface|
CN103513886A|2013-04-27|2014-01-15|展讯通信(上海)有限公司|Touch control device and target object moving method and device of touch control device|
CN104049759A|2014-06-25|2014-09-17|华东理工大学|Instruction input and protection method integrating touch screen and behavior sensing|
BR112017007976A2|2014-10-22|2018-01-23|Telefonaktiebolaget Lm Ericsson|method and device for providing a touch-based user interface|FR3028967B1|2014-11-21|2017-12-15|Renault Sas|GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE WHEN TACTILE SELECTING A DISPLAYED ELEMENT|
FR3080696B1|2018-04-25|2021-02-26|Renault Sas|LOCATION ADJUSTMENT SYSTEM IN IMMERSIVE VIRTUAL ENVIRONMENT|
US11188154B2|2018-05-30|2021-11-30|International Business Machines Corporation|Context dependent projection of holographic objects|
CN108888950B|2018-06-15|2020-01-10|腾讯科技(深圳)有限公司|Method, device and equipment for displaying game interface in terminal and storage medium|
US10795463B2|2018-10-22|2020-10-06|Deere & Company|Machine control using a touchpad|
Legal status:
2015-11-19| PLFP| Fee payment|Year of fee payment: 2 |
2016-05-27| PLSC| Publication of the preliminary search report|Effective date: 20160527 |
2016-11-18| PLFP| Fee payment|Year of fee payment: 3 |
2017-11-21| PLFP| Fee payment|Year of fee payment: 4 |
2019-11-20| PLFP| Fee payment|Year of fee payment: 6 |
2020-11-20| PLFP| Fee payment|Year of fee payment: 7 |
2021-11-22| PLFP| Fee payment|Year of fee payment: 8 |
Priority claims:
Application number | Filing date | Patent title
FR1461287A|FR3028968B1|2014-11-21|2014-11-21|GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE WHEN TACTILE SELECTING A DISPLAYED ELEMENT|FR1461287A| FR3028968B1|2014-11-21|2014-11-21|GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE WHEN TACTILE SELECTING A DISPLAYED ELEMENT|
US15/528,139| US10481787B2|2014-11-21|2015-11-19|Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element|
EP15804901.5A| EP3221780A1|2014-11-21|2015-11-19|Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element|
CN201580072439.8A| CN107111449B|2014-11-21|2015-11-19|Graphical interface and method for managing the same during touch selection of displayed elements|
KR1020177017119A| KR102237452B1|2014-11-21|2015-11-19|Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element|
PCT/FR2015/053125| WO2016079432A1|2014-11-21|2015-11-19|Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element|